Search Results: "karl"

12 April 2014

Russ Allbery: Accumulated haul

Wow, it's been a while since I've done this. In part because I've not had much time for reading books (which doesn't prevent me from buying them).
Jared Bernstein & Dean Baker Getting Back to Full Employment (non-fiction)
James Coughtrey Six Seconds of Moonlight (sff)
Philip J. Davis & Reuben Hersh The Mathematical Experience (non-fiction)
Debra Dunbar A Demon Bound (sff)
Andy Duncan & Ellen Klages Wakulla Springs (sff)
Dave Eggers & Jordan Bass The Best of McSweeney's (mainstream)
Siri Hustvedt The Blazing World (mainstream)
Jacqueline Koyanagi Ascension (sff)
Ann Leckie Ancillary Justice (sff)
Adam Lee Dark Heart (sff)
Seanan McGuire One Salt Sea (sff)
Seanan McGuire Ashes of Honor (sff)
Seanan McGuire Chimes at Midnight (sff)
Seanan McGuire Midnight Blue-Light Special (sff)
Seanan McGuire Indexing (sff)
Naomi Mitchison Travel Light (sff)
Helaine Olen Pound Foolish (non-fiction)
Richard Powers Orfeo (mainstream)
Veronica Schanoes Burning Girls (sff)
Karl Schroeder Lockstep (sff)
Charles Stross The Bloodline Feud (sff)
Charles Stross The Traders' War (sff)
Charles Stross The Revolution Trade (sff)
Matthew Thomas We Are Not Ourselves (mainstream)
Kevin Underhill The Emergency Sasquatch Ordinance (non-fiction)
Jo Walton What Makes This Book So Great? (non-fiction)

So, yeah. A lot of stuff. I went ahead and bought nearly all of the novels Seanan McGuire had out that I'd not read yet after realizing that I'm going to eventually read all of them and there's no reason not to just own them. I also bought all of the Stross reissues of the Merchant Princes series, even though I had some of the books individually, since I think it will make it more likely I'll read the whole series this way.

I have so much stuff that I want to read, but I've not really been in the mood for fiction. I'm trying to destress enough to get back in the mood, but in the meantime have mostly been reading non-fiction or really light fluff (as you'll see from my upcoming reviews).

Of that long list, Ancillary Justice is getting a lot of press and looks interesting, and Lockstep is a new Schroeder novel. 'Nuff said. Kevin Underhill is the author of Lowering the Bar, which you should read if you haven't since it's hilarious. I'm obviously looking forward to that. The relatively obscure mainstream novels here are more Powell's Indiespensable books. I will probably cancel that subscription soon, at least for a while, since I'm just building up a backlog, but that's part of my general effort to read more mainstream fiction. (I was a bit disappointed since there were several months with only one book, but the current month finally came with two books again.)

Now I just need to buckle down and read. And play video games. And do other things that are fun rather than spending all my time trying to destress from work and zoning in front of the TV.

7 April 2014

Russ Allbery: Review: Fantasy & Science Fiction, September/October 2011

Review: Fantasy & Science Fiction, September/October 2011
Editor: Gordon van Gelder
Issue: Volume 121, No. 3 & 4
ISSN: 1095-8258
Pages: 258
Another review of a magazine that I finished quite some time ago. Apologies for any inaccuracies or lack of depth in the reviews.

There wasn't much in Charles de Lint's reviews in this issue that interested me, but Michelle West covers a great selection of books. Two of them (The Wise Man's Fear and The Quantum Thief) are already on my to-read list; the third, The Postmortal, sounded interesting and would go on my list to purchase if I didn't already have so many good books I've not read. Otherwise, this issue is short on non-fiction. The only other essay entry is a film review from Kathi Maio, which is the typical whining about all things film that F&SF publishes.

"Rutger and Baby Do Jotenheim" by Esther M. Friesner: Baby is a former pole dancer with a toy poodle named Mister Snickers, which warns you right away that this story is going to involve a few over-the-top caricatures and more use of the word "piddle" than one might ideally want. Rutger is a mythology professor who tolerates her for the standard reasons in this sort of pairing. They're travelling across country to Baby's sister's wedding when their car breaks down in Minnesota, prompting an encounter with frost giants. As you might expect, this is a sort of fractured fairy tale, except based on Norse mythology instead of the more typical Grimm fare. The fun is in watching these two apparent incompetents (but with enough knowledge of mythology to clue in the reader) reproduce the confrontation between Thor and Utgard-Loki. The fight with old age is particularly entertaining. If you've read any of Friesner's other stories, you know what to expect: not much in the way of deeper meaning, but lots of fun playing with stereotypes and an optimistic, funny outcome. Good stuff. (7)

"The Man Inside Black Betty" by Sarah Langan: This story comes with a mouthful of a subtitle: "Is Nicholas Wellington the World's Best Hope?" It's also a story that purports to be written by a fictional character, in this case one Saurub Ramesh (with Langan credited as having done "research"). It's told in the style of first-person journalism, relating the thoughts and impressions of Ramesh as he interviews Nicholas Wellington. The topic is Black Betty: a black hole above Long Island Sound. Wellington is a scientific genius and iconoclast with radical theories of black holes that contradict how the government has been attempting to deal with Black Betty, unsuccessfully. The structure here was well-handled, reminding me a lot of a Michael Lewis article during the financial collapse. Langan has a good feel for how journalism of this type mixes personalities, politics, and facts. But it's all setup and no story. We get some world building, and then it's over, with no resolution except pessimism. Meh. (4)

"A Borrowed Heart" by Deborah J. Ross: Ross starts with the trappings of urban fantasy transplanted into a Victorian world: supernatural creatures about, a protagonist who is a high-class prostitute, and sex and a succubus by the second page. It evolves from there into a family drama and an investigation, always giving the reader the impression that a vampire will jump out at any moment. But the ending caught me entirely by surprise and was far more effective due to its departure from the expected path. Well done. (7)

"Bright Moment" by Daniel Marcus: The conflict between terraforming and appreciation for the universe as we find it is an old story pattern in science fiction, and Marcus doesn't add much here. I think the story would have been stronger if he'd found a way to write the same plot with a pure appeal to environmental beauty without the typical stakes-raising. But he does sprinkle the story with a few interesting bits, including a pod marriage and a futuristic version of extreme sports as a way of communing with nature. (6)

"The Corpse Painter's Masterpiece" by M. Rickert: This is typical of my reaction to a Rickert story: shading a bit too much towards horror for me, a bit too cryptic, well-written but not really my thing. It's about a corpse painter who does the work of an informal mortician, improving the appearance of bodies for their funerals, and the sheriff who brings him all the dead bodies. It takes an odd macabre twist, and I have no idea what to make of the ending. (4)

"Aisle 1047" by Jon Armstrong: Armstrong is best known for a couple of novels, Grey and Yarn, which entangle their stories in the future of marketing and commerce. One may be unsurprised, then, that this short story is on similar themes, with the intensity turned up to the parody point. Tiffan3 is a department-store saleswoman, spouting corporate slogans and advertising copy while trying to push customers towards particular products. The story follows the escalation into an all-out brand war, fought with the bubbly short-cut propaganda of a thirty-second commercial. For me, it fell awkwardly between two stools: it's a little too over-the-top and in love with its own bizarre alternate world to be effective satire, but the world is more depressing than funny and the advertising copy is grating. More of a curiosity than a successful story, I think. (5)

"Anise" by Chris DeVito: Stories that undermine body integrity and focus on the fascinated horror of violation of physical boundaries aren't generally my thing, so take that into account in this review. Anise's husband died, but that's not as much of a problem as it used to be. Medical science can resurrect people via a sort of permanent, full-body life support system, making them more cyborg than human. "Anise" is about the social consequences of this technology in a world where a growing number of people have a much different relationship with their body than the typical living person today. It's a disturbing story that is deeply concerned with the physical: sex, blood, physical intimacy in various different forms, and a twisted type of psychological abuse. I think fans of horror will like this more than I did, although it's not precisely horror. It looks at the way one's perception of self and others can change by passing through a profound physical transformation. (5)

"Spider Hill" by Donald Mead: I liked this story a lot better. It's about witchcraft and farm magic, about family secrets, and a sort of coming-of-age story (for a girl rather than a boy, for once). The main character is resourceful, determined, but also empathetic and aware of the impact of her actions, which made her more fun to read about. I doubt I'll remember this for too long, but when skimming through it again for a review, I had fond memories of it. (6)

"Where Have All the Young Men Gone?" by Albert E. Cowdrey: Cowdrey in his paranormal investigation mode, which I like better than his horror mode. For once, the protagonist isn't even a lower-class or backwoods character. Instead, he's a military historian travelling in Austria who runs across a local ghost story. This is a fairly straightforward ghost investigation that follows a familiar path (albeit to an unusual final destination), but Cowdrey is a good story-teller and I liked the protagonist. (7)

"Overtaken" by Karl Bunker: This is the sort of story that delivers its moral with the force of a hammer. It's not subtle. But if you're in the right mood for that, it's one of the better stories of its type. It's about a long-journey starship, crew in hibernation, that's overtaken by a far newer and faster mechanized ship from Earth that's attempting to re-establish contact with the old ships. The story is a conversation between the ship AIs. Save this one until you're in the mood for an old-fashioned defense of humanity. (8)

"Time and Tide" by Alan Peter Ryan: Another pseudo-horror story, although I think it's better classified as a haunting. A wardrobe recalls a traumatic drowning in the childhood of the protagonist. As these things tend to do in stories like this, reality and memory start blurring and the wardrobe takes on a malevolent role. Not my sort of thing. (3)

"What We Found" by Geoff Ryman: Any new Geoff Ryman story is something to celebrate. This is a haunting story on the boundaries between the scientific method and tribal superstition, deeply entangled with the question of how one recovers from national and familial trauma. How can we avoid passing the evils and madness of one generation down to the next? Much of the story is about family trauma, told with Ryman's exceptional grasp of character, but the science is entangled in an ingenious way that I won't spoil. As with Air, this is in no way science fiction. The science here would have fascinating and rather scary implications for our world, but clearly is not how science actually works. But as an insight into politics, and into healing, I found it a startlingly effective metaphor. I loved every bit of this. By far the best story of the issue. (9)

Rating: 7 out of 10

5 March 2014

Joachim Breitner: Creative use of screen-message

I just learnt that the math and computer science student body is using screen-message to power an information screen in their room:
There are five instances of screen-message, configured to use non-standard colors and a fixed-width font, and controlled by the rather new remote control feature, where screen-message repeatedly reads text from standard input until a form feed character (ASCII 0x0C) arrives, and displays that. The window manager ratpoison takes care of tiling the instances. It looks quite a bit like a poor man's version of dividuum's great info-beamer. I was told that they also tried out info-beamer, but the Raspberry Pi was too slow and weak for that, while screen-message and ratpoison run happily on such small hardware. Correction: Info-beamer was not tried, but rather ruled out as overkill for this task.
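A minimal feeder for that remote control protocol might look like this Python sketch (it assumes only what is described above: frames of text on standard input, each terminated by a form feed; the clock content is just an example):

import sys
import time

# Write one frame per second; the display updates each time the
# terminating form feed (ASCII 0x0C) arrives.
while True:
    sys.stdout.write(time.strftime("%H:%M:%S") + "\f")
    sys.stdout.flush()
    time.sleep(1)

Piped into one of the five instances, this would keep a clock on the board.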

1 March 2014

Francois Marier: Using vnc to do remote tech support over high-latency networks

If you ever find yourself doing a bit of technical support for relatives over the phone, there's nothing like actually seeing what they are doing on their computer. One of the best tools for such remote desktop sharing is vnc. Here's the best setup I have come up with so far. If you have any suggestions, please leave a comment!

Basic vnc configuration
First off, you need two things: a vnc server on your relative's machine and a vnc client on yours. Thanks to vnc being an open protocol, there are many choices for both. I eventually settled on x11vnc for the server and ssvnc for the client. They are both available in the standard Debian and Ubuntu repositories. Since I have ssh access on the machine that needs to run the server, I simply login and then run x11vnc. Here's what ~/.x11vncrc contains:
noxdamage
That option appears to be necessary when the desktop to share is running gnome-shell / compiz. Afterwards, I start the client on my laptop with the following command:
ssvncviewer -encodings zrle -scale 1280x775 localhost
The scaling factor is simply the resolution of the client minus any window decorations.

ssh configuration
As you can see above, the client is not connecting directly to the server. Instead it's connecting to its own vnc port (localhost:5900). That's because I'm tunnelling the traffic through the ssh connection in order to avoid relying on vnc extensions for authentication and encryption. Here's what the client's ~/.ssh/config needs for that simple use case:
Host server.example.com
  LocalForward 5900 127.0.0.1:5900
If the remote host (which has an internal IP address of 192.168.1.2 in this example) is not connected directly to the outside world and instead goes through a gateway, then your ~/.ssh/config will look like this:
Host gateway.example.com
  ForwardAgent yes
  LocalForward 5900 192.168.1.2:5900
Host server.example.com
  ProxyCommand ssh -q -a gateway.example.com nc -q0 %h 22
and the remote host will need to open up a port on its firewall for the gateway (internal IP address of 192.168.1.1 here):
iptables -A INPUT -p tcp --dport 5900 -s 192.168.1.1/32 -j ACCEPT

Optimizing for high-latency networks
Since I do most of my tech support over a very high latency network, I tweaked the default vnc settings to reduce the amount of network traffic. I added this to ~/.x11vncrc on the vnc server:
ncache 10
ncache_cr
and changed the client command line to this:
ssvncviewer -compresslevel 9 -quality 3 -bgr233 -encodings zrle -use64 -scale 1280x775 -ycrop 1024 localhost
This decreases image quality (and required bandwidth) and enables client-side caching. The magic 1024 number is simply the full vertical resolution of the remote machine, which sports a vintage 1280x1024 LCD monitor.

26 February 2014

Vincent Sanders: There are only two hard things in Computer Science: cache invalidation and naming things.

It is the first of these which I have recently been attempting and I think Phil Karlton might have a good point.
What are we talking about?
Web browsers are pretty complex beasties but the basic concept is pretty easy to understand. They fetch a bunch of files that make up a web page and render those source files into something suitable for human consumption.

It is also intuitive that fetching all the files that make up a web page, every time you browse to a new page, might be wasteful, especially if most of the files had not changed from the previous page.

In order to address this, browsers hang onto copies of these files in case your browsing needs them again; this is known as a source file cache. Caches are a widely used technique throughout many aspects of computer technology, but the basic idea is that they trade one resource for another.

In this case we are trading local storage space (memory or disc) for access time. This type of trade may give large benefits due to the large differences in access times (sometimes known as latency) between local and network resources.

For example, if the source files that make up a web page are a Megabyte in size accessed with a 2013 era PC with a 10Mbit connection to the Internet:
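(Illustrative figures only; the disc and round-trip numbers in this Python sketch are assumptions, not measurements:)

page_bits = 8 * 10**6        # one megabyte of source files, in bits
link_rate = 10 * 10**6       # a 10Mbit/s connection, in bits per second

# Network cost: transfer time plus roughly one 150ms round trip;
# disc cost: an assumed ~4ms to re-read the same data locally.
network_us = (page_bits / link_rate + 0.150) * 10**6
disc_us = 4000

print(f"network: {network_us:,.0f}us  disc: {disc_us:,}us")
# network: 950,000us  disc: 4,000us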
From this example it becomes startlingly obvious why a source cache is so desirable. I have greatly simplified the browser's operation here, as browsers usually implement many layers of additional caching in other areas to make the browsing experience subjectively smoother.

And before the observant reader suggests that we could just make the network faster, it ought to be mentioned that due to fundamental physical constraints, like the speed of light, it is unlikely that the average practical network reply will ever drop much below 150,000μs [1], or some 40 times slower than disc. Still a worthwhile gain.
Knowing the cost of everything and the value of nothing
Having shown the benefits, it might be nice to think there are no downsides to caching. This is not the case: as I mentioned, we are trading storage for time, and as in any trade one must expect there to be overheads.

In the case of a browser we face several obstacles. Firstly not every file that goes to make up a web page can be cached. The HTTP specification contains an entire section on caching which contains numerous rules and operations to ensure that only objects that should be cached are and determines for how long they can be kept before going "stale".
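For instance, a server can mark a stylesheet as cacheable for an hour with response headers along these lines (an illustrative example, not taken from any particular server):

HTTP/1.1 200 OK
Content-Type: text/css
Cache-Control: max-age=3600
ETag: "a1b2c3"

Once the hour is up the object is stale, and the cache must revalidate it (for example using the ETag) or fetch it afresh.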

A great deal of this complexity comes from just how desirable caching is to network providers. Many ISPs will have web proxy servers to cache requests for all users to reduce their bandwidth usage or, increasingly, to implement content filtering according to the local government censorship rules. The browser's cache must know how to interact with these proxies without getting erroneous results.

Then comes the problem alluded to in the section title. A browser's cache cannot grow indefinitely; eventually a system would run out of memory and disc if a browser kept every file. To deal with this, the cache is limited, sometimes by the number of files it holds, but more commonly by size, both in memory and on disc.

The caching structure used by a web browser is hierarchical in that it will first evict (move) files from memory to a backing store (disc), and when the disc cache size limit is exceeded files are evicted to the next tier, which in this case involves deleting the local copy (invalidating it) and performing a fetch from the network if the file is needed again.
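As a Python sketch (names are illustrative, and the limits are counted in objects rather than bytes for brevity), the tiering behaves like this:

def enforce_limits(memory_cache, backing_store, mem_limit, disc_limit, value):
    # Both tiers are plain dicts mapping URL -> object; value(url) scores
    # how much an entry is worth keeping (see the metadata section below).
    while len(memory_cache) > mem_limit:
        url = min(memory_cache, key=value)          # evict: memory -> disc
        backing_store[url] = memory_cache.pop(url)
    while len(backing_store) > disc_limit:
        url = min(backing_store, key=value)         # evict: disc -> gone
        del backing_store[url]                      # refetch if needed again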

Tardis at the Beeb, by Sarah G, from flickr
When the cache size is exceeded, the decision is made about what to remove from the cache using a cache strategy or algorithm. The aim of this decision is to discard the data that will not be required again for the longest time.

If we had a blue box with a madman inside, who we could persuade to bring our browsing history back from the future, implementing Bélády's algorithm might be possible. Failing a perfect solution, we are reduced to making a judgement based on the information available to us.

This involves computing a cost for replacing each entry in the cache and evicting those with the lowest values. The selected algorithm has a large impact on the effectiveness of the cache and often needs fine tuning for the task at hand.
Never has so little been measured so much
Having established that caching is useful and that we need to make decisions on what to keep, it follows that those decisions must be based on information. This information consists of two main groups:
  1. metrics about the operation of the cache as a whole.
  2. metadata carried with each object in the cache.
Maintaining all of this information is a significant part of the overheads involved with caching, and a compromise must be struck between having the necessary information to make good caching choices and the cost of maintaining that data.
Cache metrics
Most cache implementations maintain at least some basic metrics, at the very least how much storage is being used, so a decision can be made on whether evicting entries from the cache to reclaim space is necessary.

It is also common to keep a record of how well the cache is performing. The cache efficiency (the hit rate) is usually measured as the ratio of successful accesses provided from the cache (a cache hit) to the total number of requests to the cache.
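As a minimal sketch (the function name is mine):

def hit_rate(hits: int, misses: int) -> float:
    # Fraction of requests served from the cache; 1.0 means every request hit.
    total = hits + misses
    return hits / total if total else 0.0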

A perfect cache would have all hits and no misses (a hit rate of 1) while using as little storage as possible (but as already noted that requires a special madman). The hit rate can be used to automatically change the behaviour of the eviction algorithm, perhaps using more storage if the hit rate drops too low or changing the eviction frequency.
Metadata
Each object must carry with it the data necessary to regain its state when retrieved from the cache. This information is essential for the correct operation of the cache but in and of itself provides no direct value.

In the case of browser source data this includes all the headers sent in the original network fetch. These headers are themselves just metadata sent along with the object by the web server which must be preserved.

Two particularly useful values for making a good eviction decision are how long it has been since the object was last used and how many times it has been used. This is because an object that was cached a long time ago and then never used again is much less valuable than one that has recently been repeatedly used.
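One plausible way to fold those two values into a single score (an illustrative weighting, not NetSurf's actual formula):

import time

def keep_value(last_used: float, use_count: int) -> float:
    # Higher means more worth keeping: frequent use raises the score,
    # and the score decays the longer the object sits unused.
    idle_seconds = time.time() - last_used
    return use_count / (1.0 + idle_seconds)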
An Implementation
It is all very well talking about the general solution but a full understanding can only be gained by examining a real implementation. The NetSurf cache was chosen as it is a suitably small amount of code but still demonstrates all the major design features.
Interface
The cache deals with all source data and provides the browser with a single unified object request interface regardless of whether the source data can be cached or not. This is useful to hide the implementation details, allowing for changes without affecting the rest of the browser's code.

The implementation effectively consists of two lists of objects, those that can be cached and those that cannot. Each object on these lists is reference counted against one or more users, and while there is at least one user the source data is maintained in memory.

Objects determined to be uncacheable are always directly fetched from their source and are immediately released once they have no users remaining.

For cacheable objects an attempt is made to fulfil the request in descending order:
  1. from objects already in memory.
  2. from the backing store.
  3. from a network fetch.
Irrespective of the source of the data, the remaining operations on the object (determining freshness etc.) are exactly as if the request had been fulfilled from the network. This approach means the use of the cache is transparent to the object users.
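The descending-order lookup can be sketched like so (names are hypothetical, and both tiers are plain dictionaries here for brevity):

def fetch_source(url, memory_cache, backing_store, fetch_from_network):
    obj = memory_cache.get(url)           # 1. objects already in memory
    if obj is None:
        obj = backing_store.get(url)      # 2. the backing store
    if obj is None:
        obj = fetch_from_network(url)     # 3. a network fetch
    memory_cache[url] = obj
    # Freshness checks etc. now proceed as if obj came from the network.
    return obj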
Writeout
Objects are placed in the backing store asynchronously to any other operations. When an object has been fetched from the network a background task is scheduled for when the browser is otherwise idle.

This writeout task constructs a list of all cacheable objects not yet in the backing store, associates a cost value with each object and then proceeds to write those objects to the backing store in highest to lowest value order.
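A sketch of that task, reusing the scoring idea from above (names are mine):

def writeout(memory_cache, backing_store, value):
    # Store the most valuable not-yet-written objects first, so that an
    # interrupted idle period still captures the best candidates.
    pending = [url for url in memory_cache if url not in backing_store]
    for url in sorted(pending, key=value, reverse=True):
        backing_store[url] = memory_cache[url]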
Cleaning
A periodically scheduled task deals with ensuring the cache is cleaned. This consists of destroying stale objects and ensuring that fresh objects are not using more memory than the configuration permits.

Memory usage is reduced within the cleaning task by discarding the source data for objects already held in the backing store (where the data can be retrieved relatively cheaply).

Additional memory may be recovered by simply discarding unused objects held only in memory. These objects are usually those which have only recently become unused otherwise the writeout task would have committed them to the backing store.

The most recently used objects are statistically likely to be used again soon; because of this, there is a high risk of a cache miss associated with discarding these objects. In order to mitigate this undesirable effect the cleaning heuristic favours using more memory than configured for short periods rather than discarding from memory.
Backing store
The backing store is a simple key-value database. Each source data object and its metadata (its headers etc.) is associated with a unique key (the URL used to fetch it). The backing store is simply required to store and retrieve the object data associated with the given URL.

The backing store implementation in NetSurf is pluggable; this flexibility is required for dealing with the greatly varying capabilities and limits of the various systems the browser is required to execute on.

A novel aspect of this backing store interface is that if an implementation returns the wrong object it is not considered an error and is simply treated as a cache miss. This is possible because the metadata contains the URL of the stored object enabling object verification.

This behaviour is permitted because techniques which significantly improve key-value store performance (principally key hashing) become available if they are not required to always give the correct answer.

The backing store is also required to manage its size within the configured limits and deal with any filesystem behaviour details.

The reference backing store is trivial in that it performs a hash operation on the input URL, encodes the result with base64url encoding and uses that as the object filename. The length of the hash can be configured allowing use of the reference implementation in any situation where using the filesystem as a database is acceptable.

The reference backing store default hash is SHA-1 yielding almost unique [2] 160 bit key values stored in much the same way as the git DVCS. Note we are not using this hash for its cryptographic properties and at worst a collision will result in the cache being a tiny amount less efficient.
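That filename scheme is easy to restate in Python (an illustration of the description above, not NetSurf's C code; the parameter models the configurable hash length):

import base64
import hashlib

def backing_store_filename(url: str, hash_bits: int = 160) -> str:
    # Hash the URL, keep the configured number of bits, and
    # base64url-encode the digest so it is safe as a filename.
    digest = hashlib.sha1(url.encode("utf-8")).digest()[:hash_bits // 8]
    return base64.urlsafe_b64encode(digest).decode("ascii")

A collision simply surfaces as a cache miss, because the stored metadata carries the real URL for verification.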
Conclusion
Hopefully this has shown what a browser source file cache is, why it is useful, and given a basic understanding of how one is implemented.

I will admit I have glossed over some of the more challenging aspects, especially in relation to actually implementing the cache strategy, but I hope that the reader will forgive those omissions of detail in a quest for a little more clarity of the general principles.

I would like to thank John-Mark Bell for implementing the NetSurf cache and to Melodie Parry for proof reading and providing feedback.

[1] Speed of light is 3x10^8m/s and the earth's circumference is 4x10^7m; dividing circumference by speed gives 133,000μs, which is the longest round trip time from anywhere on the surface of the earth to the point furthest from it. So, assuming a whole megabyte can be transferred in one round trip and allowing for some overhead, the lower bound estimate of 150,000μs looks reasonable.

[2] Mathematically speaking this hash would allow us to say that, with the current estimated 1x10^11 URLs on the internet, the likelihood of a collision is still vanishingly small (less than 1x10^-15).

3 October 2013

Joachim Breitner: Experimenting with the TI eZ430 Chronos

For a while I have been eyeing the TI eZ430 Chronos watch, a slightly bulky sports watch that not only comes with a built-in thermometer, barometer and accelerometer, but also a freely programmable microcontroller (from the TI MSP430 family, hence the name). In addition it has a small radio chip included and can communicate wirelessly with the computer, or possibly other devices. All in all, a very geeky toy. A week ago I managed to buy one quite cheaply via eBay (€30 including shipping). It took me a while to get an overview of the various resources about the device. Here is what I have learned.

After I found that out, I tried to hack a bit on the various software. The first issue I had (and still have) is that I cannot use the buttons of the device while it is attached to the debug board. They do work if I forget to take out the battery, but supposedly that is bad for the battery. So my compile-upload-test-cycle becomes a compile-upload-detach-insert-battery-test-remove-battery-attach-cycle. I could not find out how to emulate button presses using mspdebug. Any suggestions are welcome.

The other problem was the new altimeter hardware: Neither OpenChronos nor openchronos-ng support that yet. I originally planned to port the driver from the official software to a random OpenChronos fork when I read that Richard Rondu had already done that for openchronos-ng. I found the code in a month-old merge request. I tried it, it worked, and I started to clean up the code a bit and add a variometer mode to it, i.e. displaying the current rate of climb; my ultimate goal is to use this as a surprisingly cheap variometer for paragliding. After a day of hacking and quickly lifting the watch from the room floor to the room ceiling and back, I believe that it works okish, although quite a few parameters can benefit from more tweaking: sample rate (currently 20Hz), length of history (currently 2s) and the linear regression method (currently simply the difference between the average of the samples of the current second and the samples of the last second). As always, patches welcome; the code is in my fork on sf.net.
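The estimator just described fits in a few lines (a Python sketch of the idea; the buffering and units are my assumptions):

def climb_rate(altitudes_m, rate_hz=20):
    # Two seconds of history: compare the mean altitude of the current
    # second with the mean of the second before it.
    n = rate_hz
    current = sum(altitudes_m[-n:]) / n
    previous = sum(altitudes_m[-2 * n:-n]) / n
    return current - previous        # metres per second of climb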
Next to do would be to enable some buzzing noise depending on the rate of change; during flight I don't want to decipher a low-contrast LCD screen. But it is questionable whether the buzzer is loud enough; we'll see. I was also happy to hear that there are other eZ430 hackers in Karlsruhe, one of whom, for example, is trying to make the alarm clock sleep-cycle-aware.

16 July 2013

Joachim Breitner: Real World Haskell Applications

Today I held a lightning talk at the Karlsruhe Functional Programmers Group, where I presented Real World Haskell applications, i.e. programs that are written in Haskell, are somewhat mature and well-known, and have non-programmers as their target group. Here is the (surely subjective) list that I have come up with (also see the slides): There is also an impressive list of industrial users that use Haskell without talking about it much, but that does not work well to show off that you can do any kind of application in Haskell. If I missed anything important, let me know!

22 January 2013

Russ Allbery: Review: Fantasy & Science Fiction, March/April 2011

Review: Fantasy & Science Fiction, March/April 2011
Editor: Gordon van Gelder
Issue: Volume 120, No. 3 & 4
ISSN: 1095-8258
Pages: 258
Charles de Lint's book review column sticks with the sorts of things he normally reviews: urban and contemporary fantasy and young adult. Predictably, I didn't find that much of interest. But I was happy to see that not all the reviews were positive, and he talked some about how a few books didn't work. I do prefer seeing a mix of positive and negative (or at least critical) reviews.

James Sallis's review column focuses entirely on Henry Kuttner and C.L. Moore (by way of reviewing a collection). I'm always happy to see this sort of review. But between that and de Lint's normal subject matter, this issue of F&SF was left without any current science fiction reviews, which was disappointing.

Lucius Shepard's movie review column features stunning amounts of whining, even by Shepard's standards. The topic du jour is how indie films aren't indie enough, mixed with large amounts of cane-shaking and decrying of all popular art. I find it entertaining that the F&SF film review column regularly contains exactly the sort of analysis that one expects from literary gatekeepers who are reviewing science fiction and fantasy. Perhaps David Langford should consider adding an "As We See Others" feature to Ansible cataloging the things genre fiction fans say about popular movies.

"Scatter My Ashes" by Albert E. Cowdrey: The protagonist of this story is an itinerant author who has been contracted to write a family history (for $100K, which I suspect is a bit of tongue-in-cheek wish fulfillment) and has promptly tumbled into bed with his employer. But he is somewhat serious about the writing as well, and is poking around in family archives and asking relatives about past details. There is a murder (maybe) in the family history, not to mention some supernatural connections. Those familiar with Cowdrey's writing will recognize the mix of historical drama, investigation, and the supernatural. Puzzles are, of course, untangled, not without a bit of physical danger. Experienced fantasy readers will probably guess at some of the explanation long before the protagonist does. Like most Cowdrey, it's reliably entertaining, but I found it a bit thin. (6)

"A Pocketful of Faces" by Paul Di Filippo: Here's a bit of science fiction, and another mystery, this time following the basic model of a police procedural. The police in this case are enforcing laws around acceptable use of "faces" in a future world where one can clone someone's appearance from their DNA and then mount it on a programmable android. As you might expect from that setup, the possibilities are lurid, occasionally disgusting, and inclined to give the police nightmares. After some scene-setting, the story kicks in with the discovery of the face of a dead woman who, at least on the surface, no one should have any motive to clone. There were a few elements of the story that were a bit too disgusting for me, but the basic mystery plot was satisfying. I thought the ending was a let-down, however. Di Filippo tries to complicate the story and, I thought, went just a little too far, leaving motives and intent more confusing than illuminating. (6)

"The Paper Menagerie" by Ken Liu: Back to fantasy, this time using a small bit of magic to illustrate the emotional conflicts and difficulties of allegiance for second-generation immigrants. Jack is the son of an American father and a Chinese mother who was a mail-order bride. He's young at the start of the story and struggling with the embarrassment and humiliation that he feels at his mother's history and the difficulties he has blending in with other kids, leading to the sort of breathtaking cruelty that comes so easily from teenagers who are too self-focused and haven't yet developed adult empathy. I found this very hard to read. The magic is beautiful, personal, and very badly damaged by the cruelty in ways that can never really be fixed. It's a sharp reminder of the importance of being open-hearted, but it's also a devastating reminder that the lesson is normally learned too late. Not the story to read if you're prone to worrying about how you might have hurt other people. (6)

"The Evening and the Morning" by Sheila Finch: This long novella is about a third of the issue and is, for once, straight science fiction, a somewhat rare beast in F&SF these days. It's set in the far future, among humans who are members of the Guild of Xenolinguists and among aliens called the Venatixi, and it's about an expedition back to the long-abandoned planet of Earth. I had a lot of suspension of disbelief problems with the setup here. While Earth has mostly dropped out of memory, there's a startling lack of curiosity about its current condition among the humans. Finch plays some with transportation systems and leaves humanity largely dependent on other races to explain the failure to return to Earth, but I never quite bought it. It was necessary to set up the plot, which is an exploration story with touches of first contact set on an Earth that's become alien to the characters, but it seemed remarkably artificial to me. But, putting that aside, I did get pulled into the story. Its emotional focus is one of decline and senescence, a growing sense of futility, that's halted by exploration, mystery, and analysis. The question of what's happened on Earth is inherently interesting and engaging, and the slow movement of the story provides opportunities to build up to some eerie moments. The problem, continuing a theme for this issue, is the ending. Some of the reader's questions are answered, but most of the answers are old, well-worn paths in science fiction. The emotional arc of the story is decidedly unsatisfying, at least to me. I think I see what Finch was trying to do: there's an attempted undermining of the normal conclusion of this sort of investigation to make a broader point about how to stay engaged in the world. But it lacked triumph and catharsis for me, partly because the revelations that we get are too pedestrian for the build-up they received. It's still an interesting story, but I don't think it entirely worked. (6)

"Night Gauntlet" by Walter C. DeBill, Jr., et al.: The full list of authors for this story (Walter C. DeBill, Jr., Richard Gavin, Robert M. Price, W.H. Pugmire, Jeffrey Thomas, and Don Webb) provides one with the first clue that it's gone off the rails. Collaborative storytelling, where each author tries to riff off the work of the previous author while spinning the story in a different direction, is something that I think works much better orally, particularly if you can watch facial expressions while the authors try to stump each other. In written form, it's a recipe for a poorly-told story. That's indeed what we get here. The setup is typical Cthulhu mythos stuff: a strange scientist obsessed with conspiracy theories goes insane, leaving behind an office with a map of linkages between apparently unrelated places. The characters in the story also start going insane for similar reasons, leading up to a typical confrontation with things man was not meant to know, or at least pay attention to. If you like that sort of thing, you may like this story better than I did, but I thought it was shallow and predictable. (3)

"Happy Ending 2.0" by James Patrick Kelly: More fantasy, this time of the time travel variety. (I call it fantasy since there's no scientific explanation for the time travel and it plays a pure fantasy role in the story.) That's about as much as I can say without giving away the plot entirely (it's rather short). I can see what Kelly was going for, and I think he was largely successful, but I'm not sure how to react to it. The story felt like it reinforced some rather uncomfortable stereotypes about romantic relationships, and the so-called happy ending struck me as the sort of situation that was going to turn very nasty and very uncomfortable about five or ten pages past where Kelly ended the story. (5)

"The Second Kalandar's Tale" by Francis Marion Soty: The main question I have about this story is one that I can't answer without doing more research than I feel like doing right now: how much of this is original to Soty and how much of it is straight from Burton's translation of One Thousand and One Nights. Burton is credited for the story, so I suspect quite a lot of this is from the original. Whether one would be better off just reading the original, or if Soty's retelling adds anything substantial, are good questions that I don't have the background to answer. Taken as a stand-alone story, it's not a bad one. It's a twisting magical adventure involving a djinn, a captive woman, some rather predictable fighting over the woman, and then a subsequent adventure involving physical transformation and a magical battle reminiscent of T.H. White. (Although I have quite likely reversed the order of inspiration if as much of this is straight from Burton as I suspect.) Gender roles, however, are kind of appalling, despite the presence of a stunningly powerful princess, due to the amount of self-sacrifice expected from every woman in the story. Personally, I don't think any of the men in the story are worth anywhere near the amount of loyalty and bravery that the women show. Still, it was reasonably entertaining throughout, in exactly the way that I would expect a One Thousand and One Nights tale to be. Whether there's any point in reading it instead of the original is a question I'll leave to others. (7)

"Bodyguard" by Karl Bunker: This is probably the best science fiction of the issue. The first person protagonist is an explorer living with an alien race, partly in order to flee the post-singularity world of uploaded minds and apparent stagnation that Earth has become. It's a personal story that uses his analysis of alien mindsets (and his interaction with his assigned alien bodyguard) to flesh out his own desires, emotional background, and reactions to the world. There are some neat linguistic bits here that I quite enjoyed, although I wish they'd been developed at even more length. (The alien language is more realistic than it might sound; there are some human languages that construct sentences in a vaguely similar way.) It's a sad, elegiac story, but it grew on me. (7)

"Botanical Exercises for Curious Girls" by Kali Wallace: One has to count this story as science fiction as well, although for me it had a fantasy tone because the scientific world seems to play by fantasy rules from the perspective of the protagonist. Unpacking that perspective is part of the enjoyment of the story. At the start, she seems to be a disabled girl who is being cared for by a strange succession of nurses who vary by the time of day, but as the story progresses, it becomes clear that something much stranger is going on. There are moments that capture a sense of wonder, reinforced by the persistently curious and happy narrative voice, but both are undercut by a pervasive sense of danger and dread. This is a light story written about rather dark actions. My biggest complaint with the story is that it doesn't so much end as wander off into the sunset. It set up conflict within a claustrophobic frame, so I can understand the thematic intent of breaking free of that frame, but in the process I felt like the story walked away from all of the questions and structure that it created and ended in a place that felt less alive with potential than formless and oddly pointless. I think I wanted it to stay involved and engaged with the environment it had created. (6)

"Ping" by Dixon Wragg: I probably should just skip this, since despite the table of contents billing and the full title introduction, it's not a story. It's a two-line joke. But it's such a bad two-line joke that I had to complain about it. I have no idea why F&SF bothered to reprint it. (1)

"The Ifs of Time" by James Stoddard: This certainly fits with the Arabian Nights story in this issue. The timekeeper of a vast and rambling castle (think Gormenghast taken to the extreme) wanders into a story-telling session in a distant part of the castle. The reader gets to listen to four (fairly good) short stories about time, knowledge, and memory, told in four very different genres. All of this does relate to why the timekeeper is there, and the frame story is resolved by the end, but the embedded stories are the best part; each of them is interesting in a different way, and none of them outlast their welcome. This was probably the strongest story of this issue. (7)

Rating: 6 out of 10

27 December 2012

Russ Allbery: Review: Science Fiction: The 101 Best Novels: 1985-2010

Review: Science Fiction: The 101 Best Novels: 1985-2010, by Damien Broderick & Paul Di Filippo
Publisher: Nonstop
Copyright: 2012
ISBN: 1-933065-39-7
Format: Trade paperback
Pages: 288
I like book reviews and lists of best novels, as followers of my reviews have probably noticed, so I couldn't resist when this collection made my radar. A follow-up to the earlier Science Fiction: The 101 Best Novels: 1949-1985 by David Pringle (which I have not read), it is a collection of short (two to three pages, generally) reviews of 101 recent SF novels. The date spread is fairly balanced: at least two novels from each year under consideration are featured, and no year gets more than seven or eight. The authors also clearly tried to cover the range of what falls under the science fiction genre, from alternate history through space opera and including novels normally marketed as mainstream, so the result should contain something to everyone's taste.

With those characteristics, you may suspect that the "best" part of the title is a bit questionable, and you would be right. "101 of the better novels" would be a more accurate description. While most Hugo and Nebula winners are included here, the selection is at times eclectic. But it's eclectic in the spirit of broad inclusiveness: Jumper, Temeraire, or The Hunger Games would normally not be included on this sort of list because they're too popular or "light," but they're here alongside more obscure books (at least for SF readers) like Galatea 2.2, Distance Haze, or My Dirty Little Book of Stolen Time. I doubt anyone will seriously argue that this selection should have replaced the Hugo or Nebula short lists, but one doesn't read review collections like this only to hear about books one already knows about. Those books one has either already read or already chosen not to read. The wide-ranging selection makes it likely that something here will be new to most readers.

The authors, Damien Broderick and Paul Di Filippo, are known reviewers in the SF world, and I've read reviews from both of them before. I bought this book largely on the strength of Broderick's name, since I've usually enjoyed his contributions to The New York Review of Science Fiction. Paul Di Filippo was more of a gamble; he's one of the regular reviewers for Asimov's Science Fiction and not one of my favorites. But what I usually disliked about his columns was their focus on obscure small-press titles, graphic novels, and slipstream, so I was hoping that an SF review collection aiming towards the genre mainstream would be more to my taste. The result is mixed. There are things about this selection, and about the reviews, that I enjoyed, and there are other things I found quite annoying.

First, the selection. I could (and will in a moment) talk about good and bad selections, but I also have a good statistical metric for at least the alignment of the authors' taste with mine. Of the 107 novels reviewed here (in several cases, duologies and trilogies are given single entries), I've read 39, or a little over a third. (Note that I've read every Nebula or Hugo winner in that time period and most of the Hugo nominees, so that will give you a good feel for how broad-ranging this selection is, and how far afield of the normal award slates it goes.) My average rating for those 39 books was 7.49 (including four perfect 10s). By comparison, my average rating for Hugo winners is 6.68 and for Nebula winners is 7.10. There was one 4 (The White Queen) and one 5 (Red Mars), and in both cases I can see why they're here. Of the rest of the books I read, I rated them all at least at 6 out of 10. At least among the books I've read, this seems to be a solid selection.

Sometimes the details of those selections are odd, though. For example, the authors make an effort to limit the number of selections for each author, a wise choice since they're clearly going for diversity. But if one is operating within that limitation, choosing Ammonite over Slow River for Nicola Griffith, or particularly Ventus over Lady of Mazes or even Permanence for Karl Schroeder, is baffling. There were several similar places where I thought the selection for an author was obscure, minor, or just missed obvious alternatives. Perhaps this was to fill in one of the other breadth criteria, such as balancing number of novels per year or attempting to cover each subgenre.

Also, if one is going to divide science fiction and fantasy and try to cover only the science fiction (a division that I think is quite difficult, which is why I don't do it, but it does have the merit of narrowing the field), including The Falling Woman is quite strange. It's a solid book, to be sure, and a Nebula winner. It is also quite straightforward contemporary fantasy involving ghosts and Mayan mythology, without a hint of science-fictional content. Making the protagonists archeologists and scientists doesn't make the book science fiction. The authors try to defend this (unpersuasively to me), but it wasn't the only instance here where I thought their line between science fiction and fantasy was a bit off.

That said, there are a lot of great selections here, including books that I love but that aren't frequently picked for this sort of list (The Fortunate Fall, The Time Traveler's Wife, or China Mountain Zhang, for example). It's great to see underappreciated authors like Linda Nagata, Joan Slonczewski, and Karl Schroeder featured. But, of course, one doesn't buy this sort of book just for the list, if for no other reason than that lists aren't copyrightable and one can easily find the complete list of reviewed novels on the Internet (just one example that turned up in a search). Rather, one reads this sort of book for the reviews. And that's where this book moves onto more questionable ground.

First, while I realize that everyone has different thresholds for what they consider spoilers and most professional reviewers are more cavalier about them than I am, Broderick and Di Filippo cross any line that I consider reasonable. Most of the reviews are okay, if skirting the limits, but in several places they give away key reveals of books or discuss plot twists right up to, or even including, the ending. The combined review of The Sparrow and Children of God is particularly egregious, containing unambiguous, book-destroying spoilers for The Sparrow. Giving away the ending of Ammonite is only slightly less bad. And those are just two examples I remember. This is not okay. The whole point of this sort of collection is to expose the reader to books they've not yet read but may want to. Proceeding to spoil the book for them in the course of the review is perverse. This alone would make me hesitant to recommend this collection.

Second, quite a few of the reviews in this book are, for lack of a better term, emotionally overreaching. Here's an excerpt of a review picked at random (As She Climbed Across the Table by Jonathan Lethem) that will hopefully illustrate:

Lethem's beautifully balanced, metaphorically rich prose propels this blackly jolly fable to a surprising yet satisfying conclusion. By book's end, a sense that the author had accomplished his takeoff taxiing and was now fully in flight for more cosmopolitan cities pervades the pages.

What's "beautifully balanced prose"? Could you recognize it? Does that phrase communicate anything to you other than that the authors liked the book? The whole review collection is written with adjectives and metaphors like this, and after a while it all seems a bit much. It felt like the authors were straining for ways to describe how important or significant the books are and, in the process, lost sight of the basic goal of conveying information about the book. It feels overwrought rather than informative. Even if one is reviewing the books that one considers the pinnacle of achievement in science fiction, a conversational tone with concrete examples and specifics communicates more than impressive but slippery terms like "metaphorically rich."

Lest I sound entirely negative, one thing that I did appreciate is that the reviews go to some effort (particularly for their short length) to put the work in the broader context of the field and within the author's oeuvre. Often there's some discussion of previous and subsequent work or related books, and the reviews that feature books from larger series provide good explanations for why those particular books were singled out. Sometimes the number of dangling references was frustrating; authors of these sorts of collections need to remember that most readers will not be as widely read, and reviewing books largely by comparison to other books runs the risk of missing the reader's knowledge entirely. But the reviews convey a real sense of SF as a broad conversation and provide a sense of the breadth and variety of themes and subgenres available. This is one of the fun explorations that this sort of catalog lets the authors and reader do together.

Another, more minor, touch that I appreciated was the cover art. Each review leads with an image of the reviewed book's cover (alas, only in black and white for obvious printing reasons). But rather than taking the obvious approach of using the covers of the first releases, or the covers from a particular country, they're chosen from all of the world-wide editions in all their delightful variety. Typical artistic styles for book covers vary drastically between countries, and getting to see a sample of artwork from different markets is a treat.

I want to recommend this book. It casts a much broader net than most collections of its kind and provides some needed attention to smaller corners of the genre. I was impressed by the book list before I bought it, and (with the inevitable quibbles) am even more impressed now that I've read it. Broderick and Di Filippo go out of their way to broaden the reader's horizon and open up new avenues for reading, which is one of the best things a review collection can do. But when reviewers don't avoid spoilers, I just can't recommend their work. For me, this is a cardinal sin. Combine that with a writing style that was occasionally overblown and overwritten and the merits don't quite overcome the flaws. I'm glad I read it; it got me excited about reading many books I've already purchased but not gotten to, and I got from it another slew of books to add to my to-purchase list. But I had to read it uncomfortably and lightly, constantly prepared to jump past a review that was too revealing. Rating: 6 out of 10

16 December 2012

Joachim Breitner: Circle Packing

For an upcoming introductory Haskell talk of mine (next Tuesday in Karlsruhe; please come and join if you will) I was looking into data visualization as an example application. With libraries like gloss, getting some nice vector graphics up and running within one talk is quite possible. For some reason, I want to arrange multiple circles of varying size so that they do not overlap and use as little space as possible. My first approach was to use general purpose optimizers such as cmaes, Numeric.GSL.Minimization from the hmatrix package, and force-layout. I especially liked the interface of cmaes, which is able to minimize a function of, say, type [(String, Double, Double)] -> Double by automatically finding the values of type Double in the input, as described by Takayuki Muranushi. That would have looked great in the talk. Unfortunately, none of these libraries gave sufficiently good results in a reasonable amount of time for this problem. I then reviewed some of the academic literature on the circle packing problem and there were some algorithms proposed, but no simple one, and none of the papers had a concise pseudo-code description that I could transfer to Haskell. So eventually I set out to implement a relatively dumb and slow greedy algorithm myself: I place one circle after another, starting with the largest, and among the valid positions where a new circle touches two already placed circles, I choose the position closest to the center of mass. The result, available in the circle-packing package, looks surprisingly good, here visualized using the diagrams library (see the documentation of Optimisation.CirclePacking for the code used to generate that image):
Now showing just this image is not a very good way to demonstrate my code. A few years ago, I might have created a CGI script that would dynamically generate images based on your input. But not in 2012: Since my code is pure Haskell without fancy features like type classes, I can use the fay compiler to generate JavaScript code from it. Add a little Haskell code to interact with the HTML5 canvas and now you can interactively try out circle-packing in your browser. Oh, and while I am talking about neat tricks: You can put vector graphics in your haddock documentation, as I have done for Optimisation.CirclePacking, using the syntax <<data:image/svg+xml;base64,PD94bWwgdmV...c3ZnPg==>>.
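For the curious, the greedy placement itself can be sketched as follows (in Python for illustration; the actual package is Haskell, and all names and tolerances here are mine):

import math
from itertools import combinations

def touching_positions(x0, y0, d0, x1, y1, d1):
    # Centres at distance d0 from (x0, y0) and d1 from (x1, y1):
    # the intersection points of two circles (zero, one or two of them).
    dx, dy = x1 - x0, y1 - y0
    dd = math.hypot(dx, dy)
    if dd == 0 or dd > d0 + d1 or dd < abs(d0 - d1):
        return []
    a = (d0 * d0 - d1 * d1 + dd * dd) / (2 * dd)
    h = math.sqrt(max(d0 * d0 - a * a, 0.0))
    mx, my = x0 + a * dx / dd, y0 + a * dy / dd
    return [(mx - h * dy / dd, my + h * dx / dd),
            (mx + h * dy / dd, my - h * dx / dd)]

def centre_of_mass(circles):
    # Area-weighted centroid of the circles placed so far.
    w = sum(r * r for _, _, r in circles)
    return (sum(x * r * r for x, _, r in circles) / w,
            sum(y * r * r for _, y, r in circles) / w)

def fits(p, r, circles, eps=1e-7):
    # Valid if the new circle overlaps nothing (touching is fine).
    return all(math.hypot(p[0] - x, p[1] - y) >= r + cr - eps
               for x, y, cr in circles)

def pack_circles(radii):
    # Largest first; each new circle takes the valid position touching two
    # placed circles that lies closest to the current centre of mass.
    rs = sorted(radii, reverse=True)
    placed = [(0.0, 0.0, rs[0])]
    if len(rs) > 1:
        placed.append((rs[0] + rs[1], 0.0, rs[1]))
    for r in rs[2:]:
        cx, cy = centre_of_mass(placed)
        candidates = [p
                      for (x0, y0, r0), (x1, y1, r1) in combinations(placed, 2)
                      for p in touching_positions(x0, y0, r0 + r, x1, y1, r1 + r)
                      if fits(p, r, placed)]
        # A valid spot always exists on the outside of the current layout.
        x, y = min(candidates, key=lambda p: math.hypot(p[0] - cx, p[1] - cy))
        placed.append((x, y, r))
    return placed    # list of (x, y, radius)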

12 September 2012

Joachim Breitner: Haskell Bytes

Last Saturday I held a talk at the Meta Rhein Main Chaosdays 2012 in Darmstadt, an event of the local Chaos Computer Club groups, not unlike Karlsruhe's yearly GPN. I was asked to give a talk on an advanced, Haskell-related topic, so I took my audience on a guided tour through the memory of a running Haskell program, explained the different types of closures and their layout, and showed how info tables are used to allow the garbage collector to treat the objects uniformly. We also used that knowledge to roughly predict the memory footprint of a program processing a large file using the String data type. Besides showing raw bytes (using my ghc-heap-view package and a small utility function), I included two visualizations: one is the video of a copying garbage collector at work that I blogged about last week, and the other is a nice tool that a student of mine is creating for his bachelor thesis. It allows you to visualize the heap structure of a (potentially partially evaluated) value from GHCi and to interactively evaluate parts of the structure if you want, even thunks that are hidden behind other thunks, something that you cannot do from Haskell code. The following picture is a screenshot of ghc-vis visualizing the evaluation of the famous two-line primes definition:
ghc-vis at work
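For reference, the "two-line primes definition" is presumably the classic lazy sieve. A minimal program like the following sketch, using ghc-heap-view's getClosureData (assuming the API as of late 2012), is enough to poke at its partially evaluated heap structure:

    import GHC.HeapView (getClosureData)  -- from the ghc-heap-view package

    -- The classic two-line primes definition: an infinite, lazily
    -- evaluated list, i.e. a cons cell whose tail is a thunk.
    primes :: [Int]
    primes = sieve [2..]
      where sieve (p:xs) = p : sieve [x | x <- xs, x `mod` p /= 0]

    main :: IO ()
    main = do
      let ps = primes
      print (take 3 ps)            -- force a prefix: 3 cons cells
      -- Each (:) cell occupies 3 words on the heap (header, head
      -- pointer, tail pointer), each boxed Int another 2; this is the
      -- kind of arithmetic the talk used to estimate the footprint of
      -- String processing.
      getClosureData ps >>= print  -- show the outermost closure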
The slides of my talk do not cover everything that was included, but the transcript does. It is in German, so as before: if you want me to translate it, (convince your professor or employer to) invite me to hold the talk again. The ghc-vis tool can be found on Hackage; more information about it is on the author's website.

10 June 2012

Philipp Kern: s390x accepted as release architecture

Yay, so we made it: s390x got added as a release architecture. What this means:
This will also help other 64-bit big-endian ports (like powerpc64 and sparc64) to enter the archive more easily, as most of the remaining issues are related to endianness, not to specialities of the System z hardware.
Many thanks go to Aurélien Jarno, without whom this would not have been possible. I also want to take this opportunity to thank all our s390(x) machine sponsors: ZIVIT, IIC@KIT and Marist College. There are not many mainframe owners who let free software projects work on their machines.

10 January 2012

Andrew Pollock: [life] Breaking and entering, with permission

I had a bit of an adventure yesterday, which would have taken some explaining if the police had gotten involved. It went a little something like this...

My friend and former co-worker Sara was in the US Virgin Islands for the holidays. Her boyfriend, Karl, flew there separately for the tail end of her time there. Yesterday, I received a phone call from Sara, saying that Karl had managed to fly out to the Virgin Islands without his passport. Apparently you can get there without one, but to get back into the mainland US, you need one. She wanted to know if I could get one of my lock-picking co-workers to break into their apartment, retrieve Karl's passport and mail it to them. Karl was supposed to fly out the next day. Attempts by Sara to contact her landlord had failed, so they didn't have many other options (apart from mailing me a key, which would have cost them another day).

I asked one of my co-workers, Jason, who I knew was into lock picking, if he was up for it, and he offered to put me in touch with another guy who had dominated the recent lock-picking night that he'd run. So now I'm talking to David, who's on board with the mission, but doesn't have his lock-picking gear on him. No problem, Jason says he'll lend me his, which was at work with him. So we have a plan.

Our friends Ian and Melinda are currently in Australia. They've lent us their car because it's leased, and they have some minimum mileage they're supposed to do and they're under it, so I've been driving to work in their car some days. As it happens, I drove to work in it yesterday. So now David and I set out in a car that neither of us own, with a lock-picking set that belongs to another person, to break into an apartment of someone who's in the Virgin Islands. What could possibly go wrong? I'm told that it's not illegal to own a lock-picking set, but if you're caught with one on your person and you're not a locksmith, you can get into all sorts of trouble. On top of that I'd have a hard time explaining the car I was driving.

We get to Sara and Karl's condo complex. It has a common gate that visitors would normally get buzzed through. Turns out it's not that hard to climb over. It's got some benign-looking spiky things on top, but I could get a leg over from the left-hand side of the gate and jump over without impaling myself. Then I let David in and we proceeded upstairs to Sara and Karl's apartment door, where David set to work. Sara had said that just the deadbolt was locked.

David went at it with Jason's tools, trying to be as discreet as possible. It was about 3:30pm and there was no one around, but we could hear some noises from the neighbouring apartment (the two front doors were right next to each other). After what felt like about half an hour without success (the last pin of the lock was particularly tricky, apparently), David was having to resort to noisier techniques, so I decided to take the up-front approach and just inform the next-door neighbour what we were doing in case he/she (I think it was a she) decided to call the cops on us. I told her through the door why we were there and what we were doing. She didn't seem to care too much. David then proceeded to start "raking" the lock, essentially brute-forcing the pins with a lot of jiggling, and finally managed to pick it and we were in. I quickly found Karl's passport where it was suspected to be, and then we pondered how we were going to lock the door again.
We could have just locked the doorknob instead of the deadbolt and closed the door behind us, but we weren't sure if Sara and Karl had a key to the doorknob (Sara said they always just locked the deadbolt). Sara was fine with leaving the door unlocked until they got home, but we weren't so keen on leaving our fingerprints all over the place and then leaving the door unlocked. David tried to re-pick the deadbolt so that he could lock it by the same means as he'd opened it, and I scouted around for a key. I managed to find a key that locked both the deadbolt and the doorknob, so I took that with us and locked up their apartment. In David's defense, the deadbolt was a bit stiff to lock even with the key.

I dropped David back at work, collected my stuff (it was now about 4:30pm) and headed to the UPS Store to ship Karl's passport to him as fast as humanly possible. I just made the 5pm pick-up. Today I received an SMS from Sara informing me that they had received the passport. I was very impressed with how fast it got to them.

So that was all a bit of an adventure. I'm not sure how much longer Karl is going to have to stay in the Virgin Islands as a result. I'm going to suggest that Sara and Karl leave a spare key with someone in future.

5 December 2011

Philipp Kern: New Debian buildd at Karlsruhe Institute of Technology

It took quite a lot of effort to persuade all the decision makers to make this happen, but here it is: a new Debian buildd is being hosted at the Karlsruhe Institute of Technology, to support the s390(x) ports. Its name is zemlinsky. So we've got some redundancy now, and despite them being somewhat fringe architectures, they're looking pretty good. s390x is currently being bootstrapped in the archive and it's progressing pretty quickly. This new fast builder is one of the reasons why the slope is so steep.

Pointing people at the Debian Machine Usage Policies (DMUP) was pretty helpful in getting consent with regard to network usage and acceptable use of the machines themselves. In this case the hardest part was drafting a user agreement that allows non-university persons to log into the box, which is crucial for having it maintained by the Debian System Administrators.

Thanks to all the people at IPD Reussner, the Steinbuch Centre for Computing and BelWü who helped me get this done.

1 December 2011

Joachim Breitner: Poetry in the problem class

I'm currently running the problem class for the graph theory course at the Karlsruhe Institute of Technology, held by Prof. Maria Axenovich. On one problem sheet, I felt like setting the question in verse, and indeed, two students submitted their solutions in verse form as well. I have assembled the problem and both solutions; enjoy!

2 September 2011

Asheesh Laroia: Debian bug squashing party at SIPB, MIT


(Photo credit: Obey Arthur Liu; originally on Picasa, license.) Three weekends ago, I participated in a Debian bug squashing party. It was more fun than I had guessed! The event worked: we squashed bugs. Geoffrey Thomas (geofft) organized it as an event for MIT's student computing group, SIPB. In this post, I'll review the good parts and the bad. I'll conclude with beaming photos of my two mentees and talk about the bugs they fixed. So, the good:

The event was a success, but as always, there are some things that could have gone more smoothly. Here's that list:

Still, it turned out well! I did three NMUs, corresponding to three patches submitted for release-critical bugs by my two mentees. Those mentees were:

Jessica enjoying herself

Jessica McKellar is a software engineer at Ksplice (now Oracle) and a recent graduate of MIT's EECS program. She solved three release-critical bugs. This was her first direct contribution to Debian. In particular:

Jessica has since gotten involved in the Twisted project's personal package archive. Toward the end of the sprint, she explained, "I like fixing bugs. I will totally come to the next bug squashing party."

Noah grinning

Noah Swartz is a recent graduate of Case Western Reserve University, where he studied Mathematics and played Magic. He is an intern at the MIT Media Lab, where he contributes to DoppelLab in Joe Paradiso's Responsive Environments group. This was definitely his first direct contribution to Debian. It was also one of the most intense command-line experiences he has had so far. Noah wasn't originally planning to come, but we were having lunch together before the hackathon and I convinced him to join us.

Noah fixed #625177, a fails-to-build-from-source (FTBFS) bug in nslint. The problem was that "-Wl" was written in all lowercase in the debian/rules file, as "-wl". Noah fixed that, making sure the package properly built in pbuilder, and then spent some quality time with lintian figuring out the right way to write a debian/changelog.

That's a wrap! We'll hopefully have one again in a few months, and before that, I hope to write up a guide so that we run things even more smoothly next time.

10 June 2011

Raphaël Hertzog: People behind Debian: Philipp Kern, Stable Release Manager

Philipp has been a Debian developer since 2005 and a member of the release team since 2008. Since he took on the responsibility of Stable Release Manager, the process has evolved for the better. I asked him to explain how the release team decides what's fit for stable and what's not. His work on the buildd infrastructure is also admirable, but I'll let him describe that. My questions are in bold, the rest is by Philipp.

Who are you?

I'm a 24-year-old Computer Science student living in Karlsruhe, Germany. As a student assistant I currently have two jobs: one is taking care of getting our university IPv6 network into shape, or rather generally getting fringe technologies into the network. The second is taking care of an s390x machine in the basement which my faculty recently got sponsored. In my spare time I tend to my Debian duties and I'm active in the student council (Fachschaft) as a sysadmin, software developer and, until recently, treasurer. I got accepted as a Debian Developer in 2005. I've only been really active since I was invited to join the Release Team in early 2008, after I contributed rewrites of some scripts that were lost in a disk crash of ftp-master. In late 2008 I also took on wanna-build/buildd duties.

What's your notable achievement within Debian?

For one, I worked on the deployment of a single buildd/sbuild combination across all of our buildds, and I have rebased it on the sbuild in the archive several times already. All buildds now basically look the same on all architectures, with only a few variations. The chroots are now also created in a mostly predictable manner. Then we finally got build autosigning after years of constant poking. However, the policy decision to allow it was made by the ftp-masters anyway, for which I'm grateful, as it eases the workload of the buildd maintainers, the Security Team and the Release Team quite a bit. On the release side there's the integration of volatile into the proper structures of Debian. But there I'm guilty of choosing the bad name that squeeze-updates is, in comparison with security's squeeze/updates. Let's see if we can improve that for wheezy.

The policy for stable updates has changed over time. Can you summarize what kind of updates are allowed nowadays?

All the changes that were made to the policy were driven by the aim of keeping stable (and to some lesser degree oldstable) usable throughout its lifetime. I have to admit that we have become slightly more liberal in what we accept since I took over. The previous stable update policy included fixes for critical bugs that break the system in interesting ways, and security fixes. We also opened up the possibility of including important fixes for annoying bugs on a case-by-case basis. The whole "don't update that much" part of stable release management is rooted in "let's not change the behaviour of a running system" and "let's have as few regressions as possible". We currently try to counter that with a review process that only allows self-contained fixes that were tested in unstable first, where applicable. In the future I'd also like to have a feedback process that includes the users of the packages, so that problems are reported earlier. In fact, that's also why we now send calls for testing to debian-stable-announce one week in advance (see this mail as an example). So stable is updated through point releases that happen roughly every two months for stable and roughly every four for oldstable. They are accompanied by announcements and new CD/DVD sets.
To be able to push some updates to our users with less delay, and to avoid them having to pick the updates from proposed-updates, there is stable-updates. The current policy for this suite is available in this post on debian-devel-announce. It boils down to everything urgent, like tzdata, regression fixes and volatile packages. We noticed that some packages in stable just weren't useful anymore once they were updated through volatile. Hence those are folded into stable, too. We also relaxed the rules a bit so that leaf packages like clive, which rely on external websites not changing their URLs, can be updated too. If somebody wonders what happened to the "and a half" releases: those were intended to mark the inclusion of new hardware support (mostly by including a co-installable new kernel version). This kind of flag point release is thankfully no longer needed, because our marvellous kernel team now backports certain hardware drivers to the kernel version in stable. They have some trouble soliciting test feedback for them, though. It would be cool if more people using stable could respond to their calls for testing.

What are your plans for Debian Wheezy?

Luckily the wanna-build/buildd side isn't much dependent on release cycles. The stable release management also just takes care of what gets dumped on them by the testing RMs. ;-) The near-term goals are certainly:

Apart from that my work is mainly squashing the bugs on buildd.debian.org for the wanna-build part, so if you want something fixed, you need to report it, I'm afraid.

You have several roles within Debian, in particular stable release manager and member of the wanna-build team. Which role do you enjoy more and why?

The regular duties of the SRM, apart from setting up policies, are reviewing requests for uploads to proposed-updates, accepting them, chasing builds and scheduling point releases. It's mainly steady work. As for the Release Team, it's working in a fabulous team where everyone's doing an awesome job. For me, the wanna-build role is about holding things together and fixing up stuff that breaks while the world continues to evolve. And occasionally taking a stab at the ftp-masters. So rather different. My heart probably sides with the buildd work because it's infrastructure, but release work is fun, too.

What's the biggest problem of Debian?

Mainly that we're not quick at picking up new technologies and that we have become slow at innovation. Distro-wide changes are usually a very big effort to coordinate and to finally complete. I guess people should be able to push fixes for release goals into the archive more eagerly. Also it seems to me that many DDs are actually disengaging from the release process, especially during the freeze but also before it, which makes me a bit sad. However, I'm still not sure how we could counter that. As we're all volunteers, you can't suddenly make others love the idea of releasing something together, I guess.

Do you have wishes for Debian Wheezy?

It's getting old, but there's certainly multiarch. Also a usable Python 3 is something I'd really appreciate.

Is there someone in Debian that you admire for their contributions?

Peter Palfrader did an awesome job of getting a nicely working team together for DSA and got many tasks as automated as they are now. It's a pleasure to work with them, and setting up a new buildd is, for example, a real breeze. He also developed a sane archive snapshot service, kudos for that.
Thank you to Philipp for the time spent answering my questions. I hope you enjoyed reading his answers as much as I did. Subscribe to my newsletter to get my monthly summary of the Debian/Ubuntu news and to not miss further interviews. You can also follow along on Identi.ca, Twitter and Facebook.


23 November 2010

Biella Coleman: An unlikely story about a pit bull attack, free software, and a New York Times reporter

If I told you that in the last two days I have been caught in a vortex of coincidence, a vortex composed of pit bulls, free software, diaspora (the software), mold, and a New York Times reporter, I bet you would think "not likely."

So the story started on JetBlue, which offers snacks, lots of them, and DirecTV. Since I don't have TV I kinda go on a binge, watching all sorts of shows as I make my way home. I watched a pretty disturbing but interesting documentary on Jim Jones on CNN, and a show on Animal Planet about pit bulls and parolees. When I rolled into my current digs in northern Manhattan (I am currently banished from my downtown apt due to mold, but that is a whole other story), there was a dinner party well underway. At some point in the evening, prompted by me, we talked pit bulls, as my friends want to get one but their family has issued a threat of disavowal if they do.

The next morning, I was scoping out the website for the Animal Planet show, as I was intrigued by it and frankly I kinda like pit bulls (maybe less now, although I think they are unfairly maligned). Five minutes into perusing the site, I hear screeches from hell. It sounds like a woman is being attacked. And she is. A woman right outside of my window was being attacked by a pit bull. I am staying with a friend, the open source developer Karl Fogel, and good soul that he is, he runs out to help the lady (five weeks of sickness due to mold, or so we think it was, was enough for me; I could not stomach the idea of getting bitten, so I played the role of concerned spectator). It took minutes upon minutes, really just too many minutes, to get the pit bull off; even a brick pounded against his head failed (apparently a cigarette or match held to the throat does the trick, which I found out later). Eventually the dog was extracted, a huge team of cops showed up, the dog was whisked away, the victim was taken to the hospital, and life returned to calm and quiet.

The next day, I was being interviewed by the New York Times reporter Jim Dwyer, who wrote a story about Diaspora for the New York Times back in the summer, helping to propel it from relative obscurity to near insta-fame (one of the Diaspora developers, Max, was my student). We were running out of time (I had another appointment), so I asked him if he lived in northern Manhattan, as that is what his bio page indicates. He confirmed; I explained I was up there and that we could meet up there later to finish up. He inquired what part, I told him roughly where I was, he remarked he was near there, and so naturally I told him about the crazy pit bull attack I had witnessed from my window, as I can't shut my trap when it comes to things like that.

Well, you know what is coming next: he was there, helping Karl (and others) deal with the pit bull attack. He lives nearby, heard the shrieks of agony and came out to aid. All in all it was pretty horrific. He also met Karl, insofar as Karl gave him his phone number and email just in case he was needed as a witness (Karl had to dash off to catch a plane). And the funny thing, as you might also guess: Jim, who is doing some more writing on tech, free software, etc., really should talk to Karl given his key role in the community. So they have already met, although under odd and terrible circumstances.
I am not sure whether I am more wigged out by the fact that I was reading about pit bulls when the attack happened, or by the fact that the reporter who interviewed me was there alongside a free software developer he really needs to interview. Whatever the case, I kinda hope the vortex of coincidence now leaves me and hits someone else (sans any horrible attack). Or else, as Karl noted in the blog comments, I will have to be very careful about what shows I watch:
Amen to that! Enough with the coincidence vortex. As I said to Biella in IRC later: do us a favor, don't watch any shows about nuclear attacks on New York, okay?

16 November 2010

Joachim Breitner: Student research project on Shivers' control flow algorithm

As my student research project, I worked on a formalization of Olin Shivers' control flow algorithm for functional languages (as written down in his dissertation) in the theorem prover Isabelle. I just handed in my report to my supervisor, Andreas Lochbihler. I have also submitted the Isabelle theories to the Archive of Formal Proofs and uploaded the Haskell prototype to Hackage. The complete code is in a git repository.
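To give a flavor of the question such an analysis answers: for a higher-order program, which lambdas can each expression evaluate to? The following is not Shivers' algorithm, just a naive abstract evaluator sketched for illustration (it terminates only on non-recursive terms, unlike the real analysis):

    import qualified Data.Map as M
    import qualified Data.Set as S

    type Var = String

    data Expr = Ref Var | Lam Var Expr | App Expr Expr
      deriving (Eq, Ord, Show)

    -- An abstract value is the set of lambdas a term may evaluate to.
    type AbsVal = S.Set Expr
    type Env    = M.Map Var AbsVal

    -- Naive abstract evaluation: variables look up their flow set,
    -- lambdas denote themselves, applications try every possible callee.
    eval :: Env -> Expr -> AbsVal
    eval env (Ref x)     = M.findWithDefault S.empty x env
    eval _   l@(Lam _ _) = S.singleton l
    eval env (App f a)   =
      let callees = eval env f
          arg     = eval env a
      in S.unions [ eval (M.insert x arg env) body
                  | Lam x body <- S.toList callees ]

For instance, eval M.empty (App (Lam "x" (Ref "x")) (Lam "y" (Ref "y"))) returns the singleton set containing Lam "y" (Ref "y"). Shivers' actual analysis computes such flow sets as a terminating fixpoint over program labels, which is what makes the formalization interesting.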

3 November 2010

Dirk Eddelbuettel: Rcpp 0.8.8

A bug-fix release 0.8.8 of Rcpp is now available. It is awaiting processing at CRAN and will be uploaded to Debian once processed. In the meantime, sources are available from my local directory here. This release follows on the heels of 0.8.7 and contains fixes for a few small things Romain and I noticed over the two weeks since releasing 0.8.7, along with a small number of new tweaks. The NEWS entry follows below:
0.8.8   2010-11-01
    o   New syntactic shortcut to extract rows and columns of a Matrix. 
        x(i,_) extracts the i-th row and x(_,i) extracts the i-th column. 
    
    o   Matrix indexing is more efficient. However, faster indexing is
        disabled if g++ 4.5.0 or later is used.
    o   A few new Rcpp operators such as cumsum, operator=(sugar)
    o   Variety of bug fixes:
        - column indexing was incorrect in some cases
        - compilation using clang/llvm (thanks to Karl Millar for the patch)
        - instantiation order of Module corrected
        - POSIXct, POSIXt now correctly ordered for R 2.12.0 
As always, even fuller details are on the Rcpp Changelog page and the Rcpp page, which also leads to the downloads, the browseable doxygen docs and zip files of doxygen output for the standard formats. A local directory has source and documentation too. Questions, comments etc. should go to the rcpp-devel mailing list off the R-Forge page.
